Custom layer example

The Custom layer object is used to add custom layers to your Kanzi Studio project. You implement the functionality of a custom layer in the application code and publish it for use in the Application Window by using the Custom Preview.

This example demonstrates how to create a custom layer and render video frames with Kanzi using a custom layer implementation. On Android, a shared image texture that utilizes the OES_EGL_image_external extension is used to stream the video from an external context.

The example is split into the following projects:

You can find the example in <KanziWorkspace>/Examples/Custom_layer.

The Kanzi Studio project uses an empty layer as the parent for the custom layer, which allows:

The custom layer example consists of the following components:

  1. The Custom_layer component, which implements a KzuLayerClass.
  2. The example runs on the Windows and Android platforms:
    1. The common implementation is provided by custom_layer.c, video_layer.h, and video_layer.c.
    2. The Windows platform uses Microsoft Media Foundation for video playback: <KanziWorkspace>/Examples/Custom_layer/Application/platforms/win32/src
    3. The Android platform uses the native media player: <KanziWorkspace>/Examples/Custom_layer/Application/platforms/android/src

Using an external texture handle

Because Android’s native code and the Kanzi application framework run in different contexts, they cannot share GPU resources directly. A widely available OpenGL ES extension (GL_TEXTURE_EXTERNAL_OES) provides support for external texture resources that can be shared between contexts with a few limitations. In the Custom layer example, Android’s media player renders the video stream to a surface that is bound to the external texture handle. On the Kanzi side, the video layer’s texture is bound to the same external texture handle and displayed in the render loop.

The approach is extendable to platforms where:

A typical example is streaming video.
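The core of the approach is that the two sides never exchange objects, only an integer handle that both resolve against shared state. The following plain-Java sketch illustrates that pattern off-device; all names (SharedHandleSketch, publishFrame, sampleFrame) are hypothetical, and a map stands in for the GL driver state that holds the real texture on a device.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicReference;

// Conceptual analogy of the shared external texture handle: the producer
// and the consumer only share an integer handle, never each other's objects.
public class SharedHandleSketch {
    // handle -> latest frame published under that handle
    static final ConcurrentHashMap<Integer, AtomicReference<byte[]>> registry =
            new ConcurrentHashMap<>();
    static int nextHandle = 1;

    // "Producer side" (the media player context): allocate a handle
    // and publish frames to it.
    static synchronized int createExternalHandle() {
        int handle = nextHandle++;
        registry.put(handle, new AtomicReference<byte[]>(null));
        return handle;
    }

    static void publishFrame(int handle, byte[] frame) {
        registry.get(handle).set(frame);
    }

    // "Consumer side" (the Kanzi render loop): sample whatever frame is
    // currently bound to the handle, without owning the producer's objects.
    static byte[] sampleFrame(int handle) {
        return registry.get(handle).get();
    }
}
```

On a real device the registry role is played by the GL driver, and the frame data lives in GPU memory referenced through the external texture target.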

The key integration points for the usage of the texture handle in the example are:

  1. Create a shared image bound to an external texture handle (integer ID) in your Kanzi process using
    kzuSharedImageTextureCreateExternal(const struct KzuResourceManager* resourceManager,
    	kzString name, kzUint width, kzUint height, void* userData,
    	kzUint externalTextureHandle,
    	kzuSharedImageTextureExternalTextureCreateFunction externalTextureCreator,
    	struct KzuSharedImageTexture** out_imageTexture)
  2. In the Android process that delivers the video service, generate a texture ID and bind it to the external texture target:
    GLES20.glGenTextures(1, textures, 0);
    mTextureID = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
  3. Wrap the texture ID in a SurfaceTexture and set the resulting Surface for the media player:
    mSurface = new SurfaceTexture(mTextureID);
    Surface surface = new Surface(mSurface);
    mMediaPlayer.setSurface(surface);
  4. Synchronize the video stream with Kanzi’s rendering loop. In the example, this is done in the custom video layer’s onRender callback by calling the video update function on the Android side.
    // Kanzi side
    KZ_CALLBACK static kzsError videoLayerRender(...)
    	...
    	videoUpdateVideoSurfaceTexture(videoLayer->video);
    // Android side
    // Flag that a new frame is available:
    synchronized public void onFrameAvailable(SurfaceTexture surface) {
    	updateSurface = true;
    }
    // Update the texture:
    public float[] updateVideoSurfaceTexture()
    {
    	synchronized (this) {
    		if (updateSurface) {
    			mSurface.updateTexImage();
    			mSurface.getTransformMatrix(mSTMatrix);
    			updateSurface = false;
    		}
    		...
    	}
    }
  5. Define the extension and external texture use in the corresponding shader.
    In the Custom layer example, this is defined in the LayerDefault material type’s fragment shader:
    #extension GL_OES_EGL_image_external : require
    uniform samplerExternalOES kzTexture;
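The synchronization in step 4 is a latch-one-frame pattern: the decoder may signal any number of times between render passes, but the render loop consumes at most one pending update per pass. The following self-contained sketch isolates that pattern; the class name FrameSync and the frame counter are hypothetical stand-ins for the SurfaceTexture calls, which only work on a device.

```java
// Sketch of the frame-synchronization pattern from step 4: the decoder
// thread flags availability, the render thread latches the update.
public class FrameSync {
    private boolean updateSurface = false;
    private int framesConsumed = 0;

    // Producer side: the video decoder flags that a new frame is ready.
    // Mirrors onFrameAvailable() in the example.
    public synchronized void onFrameAvailable() {
        updateSurface = true;
    }

    // Consumer side: called once per render pass from the engine's
    // onRender callback; consumes at most one pending frame.
    // Mirrors updateVideoSurfaceTexture() in the example.
    public synchronized boolean updateVideoSurfaceTexture() {
        if (updateSurface) {
            framesConsumed++;     // stands in for mSurface.updateTexImage()
            updateSurface = false;
            return true;
        }
        return false;
    }

    public synchronized int framesConsumed() {
        return framesConsumed;
    }
}
```

Note that multiple frame-available signals between two render passes collapse into a single texture update, which keeps the render loop at its own pace regardless of the video frame rate.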

See also

Layers

Using image layers